News Journal

Building a safer future for AI research

by Mountain Media, LLC
January 13, 2026
in School
(From left) John Talerico, Lisa Lee, Qin Zhu, and Rocky Clancy have received support from the National Science Foundation to improve research security for artificial intelligence-related projects. Photo by Chelsea Seeber for Virginia Tech

Research with artificial intelligence (AI) has exploded over the last five years. In fiscal year 2025 alone, the National Science Foundation dedicated $2 billion to research and development for AI-related projects in an effort to emphasize U.S. leadership in this space.

While AI research has become a strategic national asset, critical to innovation, global competitiveness, and security, it also comes with increased vulnerability to espionage, misuse, and ethical misconduct.

Researchers at Virginia Tech have been awarded $300,000 by the National Science Foundation to tackle these concerns head-on by building a more resilient, responsible, and secure AI research ecosystem.

A need for more secure research

Historically, concern for secure research centered on military technologies or commercially sensitive innovations.

“There’s now a concern with the entire research life cycle, especially with emerging technologies like AI and biotechnologies, in a way that there just wasn’t before,” said Rockwell Clancy, research scientist in the Department of Engineering Education. “The risks of stolen intellectual property can happen during data collection, while co-developing models with international collaborations, throughout evaluation and publication, or even in routine conversations about project progress.”

The need for additional systems and protection for research is driven by several factors:

  • Other countries can move research discoveries to applied technology rapidly, causing concern that the U.S. may be falling behind.
  • There’s been an uptick in cases involving intellectual property diversion or illicit technology transfer.
  • The sensitive data, models, and methods used in AI research can be exploited long before the research is completed, giving other countries access to proprietary information.
  • Federal agencies acknowledge that existing standards don’t adequately address vulnerabilities.

Creating tools to protect

While many universities are talking about research security, few are producing evidence-based tools that faculty can use as part of their daily work.

Additionally, most national efforts are still conceptual: policy papers, high-level guidance, and broad discussions about foreign influence or data protections. Federal agencies are asking for discipline-specific, actionable training that helps researchers understand what threats look like in their own fields.

“Our team here at Virginia Tech is one of the few groups developing evidence-based scenario tools to help researchers understand and determine what threats across the AI research life cycle look like,” said Qin Zhu, associate professor of engineering education and principal investigator.

To create these tools, the team will interview and survey various stakeholders in the community of research security, including professionals doing AI research, to learn more about the security threats they’ve witnessed firsthand. From those data, the team will build fictional but realistic scenarios that mimic breaches or misconduct throughout various stages of the research life cycle. Once refined, the team plans to package these tools into an accessible digital tool kit to help universities, funding agencies, and industry partners better recognize and respond to risks.

“Our ultimate goal is to show our industry partners and funding agencies that we are knowledgeable and care deeply about secure research,” said John Talerico, assistant vice president for research security. “We want to be able to say, ‘Come sponsor your research here at Virginia Tech. Your work is safe with us.’”

 

Virginia Tech
